Cell Proliferation
Wiley
Preprints posted in the last 7 days, ranked by how well they match Cell Proliferation's content profile, based on 12 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit.
Zhang, Q.; Tang, Q.; Vu, T.; Pandit, K.; Cui, Y.; Yan, F.; Wang, N.; Li, J.; Yao, A.; Menozzi, L.; Fung, K.-M.; Yu, Z.; Parrack, P.; Ali, W.; Liu, R.; Wang, C.; Liu, J.; Hostetler, C. A.; Milam, A. N.; Nave, B.; Squires, R. A.; Battula, N. R.; Pan, C.; Martins, P. N.; Yao, J.
End-stage liver disease (ESLD) is one of the leading causes of death worldwide. Currently, the only curative option for patients with ESLD is liver transplantation. However, the demand for donor livers far exceeds the available supply, partly because many potentially viable livers are discarded following biopsy evaluation. While biopsy is the gold standard for assessing liver histological features related to graft quality and transplant suitability, it often leads to high discard rates due to its susceptibility to sampling errors and limited spatial coverage. Moreover, biopsy is invasive, time-consuming, and unavailable in clinical facilities with limited resources. Here, we present an AI-assisted photoacoustic/ultrasound (PA/US) imaging framework for quantitative assessment of human donor liver graft quality and transplant suitability at the whole-organ scale. With multimodal volumetric PA/US images as the input, our deep-learning (DL) model accurately predicted the risk level of fibrosis and steatosis, which indicate graft quality and transplant suitability, when compared with true pathological scores. The DL model also identified the imaging modes (PAI wavelength and B-mode USI) that correlated most strongly with prediction accuracy, without relying on ill-posed spectral unmixing. Our method was evaluated in six discarded human donor livers comprising sixty spatially matched regions of interest. Our study will pave the way for a new standard of care in assessing organ graft quality and transplant suitability that is fast, noninvasive, and spatially thorough, preventing unnecessary organ discards in liver transplantation.
Wang, L.; Yang, Y.; Ng, T. K.; Chen, J.; Sun, X.
Purpose: To identify the ocular biometric parameters associated with refractive outcomes in Chinese primary angle-closure glaucoma (PACG) patients receiving phacoemulsification and intraocular lens (IOL) implantation (PEI) surgery. Methods: 165 Chinese PACG patients receiving PEI with goniosynechialysis (GSL) and 53 cataract patients receiving PEI alone as controls were recruited. The prediction accuracy of IOL power calculation was assessed by the prediction error (PE), mean absolute error (MAE), median absolute error (MedAE), and the proportions of eyes with a PE within ±0.25 diopters (D), ±0.50 D, ±0.75 D, and ±1.00 D. The association of different ocular biometric parameters with the PE of IOL calculation was evaluated. Results: The PACG patients had significantly higher absolute PE than the control subjects, especially the acute PACG patients. The axial length (AL), change in aqueous depth pre- and post-surgery (ΔAD), and the ratio ΔAD/AL were significantly associated with the PE in acute PACG patients. The association of ΔAD with the PE of IOL power calculation was found in PACG patients with AL ≥ 22 mm. Conclusions: This study revealed the association of AL and ΔAD with the PE of IOL calculation in Chinese PACG patients. Precise prediction of ΔAD is necessary for acute PACG patients, especially those with AL ≥ 22 mm, to improve refractive outcomes.
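As an illustration of the accuracy metrics used above, here is a minimal Python sketch of the PE, MAE, MedAE, and within-threshold proportions; the predicted and achieved refractions are hypothetical values, not data from the study:

```python
from statistics import mean, median

def pe_summary(predicted, achieved):
    """Summarise refractive prediction error (PE) for a set of eyes.

    PE = achieved postoperative refraction - predicted refraction (in D).
    Returns mean PE (ME), mean absolute error (MAE), median absolute
    error (MedAE), and the proportion of eyes with |PE| within each
    dioptre threshold.
    """
    pe = [a - p for p, a in zip(predicted, achieved)]
    abs_pe = [abs(e) for e in pe]
    within = {t: sum(e <= t for e in abs_pe) / len(pe)
              for t in (0.25, 0.50, 0.75, 1.00)}
    return {"ME": mean(pe), "MAE": mean(abs_pe),
            "MedAE": median(abs_pe), "within": within}

# Hypothetical predicted vs. achieved spherical equivalents (D)
predicted = [-0.50, -0.25, 0.00, -0.75, -0.10]
achieved = [-0.30, -0.80, 0.20, -0.70, -1.30]
summary = pe_summary(predicted, achieved)
```

Note that MAE and MedAE can diverge sharply when a few eyes have large refractive surprises, which is why the paper reports both alongside the within-threshold proportions.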
Hoskins, J. W.; Christensen, T. A.; Eiser, D.; Char, E.; Mobaraki, M.; O'Brien, A.; Collins, I.; Zhong, J.; Patel, M. B.; Prasad, G.; Pancreatic Cancer Cohort Consortium and Pancreatic Cancer Case-Control Consortium (PanScan/PanC4); Arda, E.; Connelly, K. E.; Amundadottir, L. T.
Pancreatic ductal adenocarcinoma (PDAC) remains one of the deadliest human cancers. The current largest published PDAC Genome-Wide Association Study (GWAS) identified 23 genetic risk signals, but most lack sufficient characterization. This study aimed to functionally characterize the chr13q12.2 (PLUT/PDX1) PDAC GWAS risk locus. Fine-mapping, luciferase reporter assays, and electrophoretic mobility shift assays implicated rs9581943, a PDX1 promoter SNP, as a functional variant underlying this GWAS signal. GTEx expression QTL analyses identified rs9581943 as a significant PDX1 eQTL in pancreas, and CRISPR/Cas9 editing in PDAC-derived cell lines confirmed a functional relationship. PDX1 is a transcription factor involved in early pancreas development and β-cell homeostasis, but its role in exocrine pancreatic cells is unclear. Single-nucleus RNA-seq analyses of pancreatic acinar and ductal cells from neonatal, adult, and chronic pancreatitis donors suggested PDX1 activity alleviates high secretory load and ER stress in acinar cells and biases ductal cells toward homeostatic phenotypes. Similarly, scRNA-seq analyses of pancreatic tumors suggested PDX1 activity reduces biosynthetic and inflammatory stress and promotes epithelial differentiation. Our study therefore implicates rs9581943 as a causal variant for the chr13q12.2 PDAC GWAS signal wherein the risk allele reduces PDX1 expression, eroding PDX1's capacity to buffer stress and stabilize epithelial cell fate in the exocrine compartment.
Wang, X.-Y.; Li, M.-M.; Zhao, S.-M.; Jia, X.-Y.; Yang, W.-S.; Chang, L.-L.; Wang, H.-M.; Zhao, J.-T.
Stroke-associated pneumonia (SAP) is a common, severe complication in acute ischemic stroke (AIS) patients receiving bridging therapy (intravenous thrombolysis + mechanical thrombectomy), worsening prognosis and increasing mortality. Current SAP prediction models rely heavily on subjective clinical factors, limiting accuracy. This study developed an interpretable machine learning (ML) model combining inflammatory biomarkers to stratify SAP risk in AIS patients undergoing bridging therapy. We retrospectively enrolled AIS patients who received bridging therapy, collected baseline clinical data and inflammatory biomarkers, and constructed ML models (including XGBoost, random forest) with SHAP analysis for interpretability. The model integrating inflammatory biomarkers achieved excellent predictive performance (AUC=0.XX, 95%CI: XX-XX), outperforming traditional clinical models. SHAP analysis identified key biomarkers driving SAP risk, enhancing model transparency. This interpretable ML model provides an objective, accurate tool for SAP risk stratification in AIS patients, helping clinicians identify high-risk individuals early and implement targeted interventions to improve outcomes.
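The model's reported discrimination (its AUC) can be computed without any ML library via the Mann-Whitney formulation; a minimal sketch with hypothetical risk scores, not the study's data or model:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney formulation:
    the probability that a randomly chosen positive case (here, SAP)
    is assigned a higher predicted risk than a randomly chosen
    negative case, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted SAP risks and observed outcomes (1 = SAP)
risks = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1, 1, 0, 1, 0, 0, 1, 0]
```

Because AUC depends only on the ranking of scores, it is invariant to any monotone rescaling of the model's predicted probabilities.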
Jacobsen, A. M.; Quednow, B. B.; Bavato, F.
Importance: Blood neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) are entering clinical use in neurology as markers of neuroaxonal and astrocytic injury, but their utility in psychiatry is unclear. Objective: To determine whether psychiatric diagnoses are associated with altered plasma NfL and GFAP levels. Design, Setting, and Participants: This population-based study examined plasma NfL and GFAP among 47,495 participants from the UK Biobank (54.0% female; 93.5% White; mean [SD] age 56.8 [8.2] years) who provided blood samples and sociodemographic and clinical data between 2006 and 2010. Normative modeling was applied to assess associations between 7 lifetime psychiatric diagnostic categories and deviations from expected NfL and GFAP levels, while accounting for neurological diagnoses, cardiometabolic burden, and substance use. Data were analyzed between July 2025 and March 2026. Main Outcomes and Measures: Deviations in plasma NfL and GFAP levels from normative predictions. Results: Relative to the reference population, plasma NfL levels were higher among individuals with bipolar disorder (d=0.20; 95% CI, 0.03-0.37; p=0.03), recurrent depressive disorder (d=0.23; 95% CI, 0.07-0.38; p=0.009), and depressive episodes (d=0.06; 95% CI, 0.02-0.10; p=0.01), lower among individuals with anxiety disorders (d=-0.07; 95% CI, -0.12 to -0.02; p=0.008), but did not differ in schizophrenia spectrum, stress-related, or other psychiatric disorders. Plasma GFAP levels were not elevated in any psychiatric disorder. Variability in NfL levels was greater among individuals with schizophrenia spectrum disorders (variance ratio [VR]=1.30; p=0.005), depressive episodes (VR=1.06; p=0.006), and anxiety disorders (VR=1.08; p=0.005). Variability in GFAP levels was increased only in anxiety disorders (VR=1.08; p=0.01). Plasma NfL levels exceeding percentile-based normative thresholds were more common among individuals with schizophrenia spectrum disorders, bipolar disorder, recurrent depressive disorder, and depressive episodes. Neurological diagnoses, cardiometabolic burden, and substance use were associated with plasma NfL and GFAP levels. Conclusions and Relevance: This study provides population-level evidence of plasma NfL elevation in bipolar and depressive disorders and increased variability in schizophrenia spectrum, bipolar, and depressive disorders, supporting its potential as a biomarker in psychiatry and informing its ongoing neurological applications. Plasma GFAP levels, in contrast, were largely unaltered across psychiatric disorders. Key Points: Question: Are plasma neurofilament light chain (NfL) and glial fibrillary acidic protein (GFAP) levels altered in psychiatric disorders? Findings: In this cohort study including 47,495 individuals, normative modeling revealed that plasma NfL levels were elevated in bipolar and depressive disorders, whereas plasma GFAP levels were not elevated in any psychiatric disorder. Plasma NfL levels also showed higher variability in schizophrenia spectrum, bipolar, and depressive disorders. Meaning: Plasma NfL shows distinct alterations in schizophrenia spectrum and affective disorders, supporting its further investigation as a biomarker in clinical psychiatry and highlighting the need to consider psychiatric comorbidity in neurological applications.
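A toy version of the normative-modeling idea described above (fit the expected biomarker level on a reference group, then express a diagnostic group as standardized deviations from that expectation) might look like this; the age/NfL numbers are invented for illustration, and real normative models use far richer covariates:

```python
from statistics import mean, pvariance

def linfit(x, y):
    """Ordinary least squares slope and intercept (toy normative model)."""
    mx, my = mean(x), mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def normative_deviations(ref_age, ref_level, grp_age, grp_level):
    """Express a diagnostic group's biomarker levels as standardized
    deviations (z-like scores) from the age-expected level learned
    on the reference population."""
    slope, icept = linfit(ref_age, ref_level)
    resid = [b - (icept + slope * a) for a, b in zip(ref_age, ref_level)]
    sd = pvariance(resid) ** 0.5
    return [(b - (icept + slope * a)) / sd
            for a, b in zip(grp_age, grp_level)]

# Invented reference data: NfL rises roughly linearly with age
ref_age, ref_nfl = [50, 55, 60, 65], [19, 22, 21, 24]
z = normative_deviations(ref_age, ref_nfl, [55, 60], [23, 21])
```

Group-level statistics such as the variance ratios reported above then compare the spread of these deviations against the reference spread (which is 1 by construction).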
Harikumar, A.; Baker, B.; Amen, D.; Keator, D.; Calhoun, V. D.
Single photon emission computed tomography (SPECT) is a highly specialized imaging modality that enables measurement of regional cerebral perfusion and, in particular, resting cerebral blood flow (rCBF). Recent technological advances have improved SPECT quantification and reliability, making it increasingly useful for studying rCBF abnormalities and perfusion network alterations in psychiatric and neurological disorders. To characterize large-scale functional organization in SPECT data, data-driven decomposition methods such as independent component analysis (ICA) have been used to extract covarying perfusion patterns that map onto interpretable brain networks. Blind ICA provides a data-driven approach to estimate these networks without strong prior assumptions. More recently, a hybrid approach that leverages spatial priors to guide a spatially constrained ICA (sc-ICA) has been used to fully automate the ICA analysis while also providing participant-specific network estimates. While this has been reliably demonstrated in fMRI with the NeuroMark template, there is currently no comparable SPECT template. A SPECT template would enable automatic estimation of functional SPECT networks with participant-specific expressions that correspond across participants and studies. The current study introduces a new replicable NeuroMark SPECT template for estimating canonical perfusion covariance patterns (networks). We first identify replicable SPECT networks using blind ICA applied to two large-sample SPECT datasets. We then demonstrate the use of the resulting template by applying sc-ICA to an independent schizophrenia dataset. In sum, this work presents and shares the first NeuroMark SPECT template and demonstrates its utility in an independent cohort, providing a scalable and robust framework for network-based analyses.
Xu, M.; Philips, R.; Singavarapu, A.; Zheng, M.; Martin, D.; Nikolin, S.; Mutz, J.; Becker, A.; Firenze, R.; Tsai, L.-H.
Background: Gamma oscillation dysfunction has been implicated in neuropsychiatric disorders. Restoring gamma oscillations via brain stimulation represents an emerging therapeutic approach. However, the strength of its clinical effects and treatment moderators remain unclear. Method: We conducted a systematic review and meta-analysis to examine the clinical effects of gamma neuromodulation in neuropsychiatric disorders. A literature search for controlled trials using gamma stimulation was performed across five databases through April 2025. Effect sizes were calculated using Hedges' g. Separate analyses using the random-effects model examined the clinical effects in schizophrenia (SZ), major depressive disorder (MDD), bipolar disorder, and autism spectrum disorder. For SZ and MDD, subgroup analyses evaluated the effects of stimulation modality, stimulation frequency, treatment duration, and pulses per session. Results: Fifty-six studies met the inclusion criteria (N_SZ = 943, N_MDD = 916, N_BD = 175, N_ASD = 232). In SZ, gamma stimulation was associated with improvements in positive (k = 10, g = -0.60, p < 0.001), negative (k = 12, g = -0.37, p = 0.03), depressive (k = 8, g = -0.39, p < 0.001), and anxious symptoms (k = 5, g = -0.59, p < 0.001), and in overall cognitive function (k = 7, g = 0.55, p < 0.001). Stimulation frequency and treatment duration moderated therapeutic effects. In MDD, reductions in depressive symptoms were observed (k = 23, g = -0.34, p = 0.007). Conclusion: Gamma neuromodulation showed moderate therapeutic benefits in SZ and MDD. Substantial heterogeneity likely reflects protocol differences, highlighting the need for well-powered future trials.
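A minimal sketch of the effect-size machinery this meta-analysis describes: Hedges' g with the small-sample correction, and DerSimonian-Laird random-effects pooling. All inputs below are hypothetical, not values from the included trials:

```python
from math import sqrt

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardised mean difference with Hedges' small-sample correction."""
    sp = sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                  # Cohen's d with pooled SD
    j = 1 - 3 / (4 * (n1 + n2) - 9)    # small-sample correction factor J
    return j * d

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate."""
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_re = [1 / (v + tau2) for v in variances]
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
```

When the studies agree (Q below its degrees of freedom), tau² is truncated to zero and the random-effects estimate collapses to the fixed-effect one; heterogeneity like that reported above instead inflates each study's variance and pulls the weights toward equality.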
Quide, Y.; Lim, T. E.; Gustin, S. M.
Background: Early-life adversity (ELA) is a risk factor for enduring pain in youth and is associated with alterations in brain morphology and function. However, it remains unclear whether ELA-related neurobiological changes contribute to the development of enduring pain in early adolescence. Methods: Using data from the Adolescent Brain Cognitive Development (ABCD) Study, we examined multimodal magnetic resonance imaging (MRI) markers in children assessed at baseline (ages 9-11 years) and at 2-year follow-up (ages 11-13 years). ELA exposure was defined at baseline to maximise temporal separation between early adversity and later enduring pain. Participants with enduring pain at follow-up (n = 322) were compared to matched pain-free controls (n = 644). Structural MRI, diffusion MRI (fractional anisotropy, mean diffusivity), and resting-state functional connectivity data were analysed. Linear models tested main effects of enduring pain, ELA, and their interaction on brain metrics, controlling for relevant covariates. Results: ELA exposure was associated with smaller caudate and nucleus accumbens volumes, and reduced surface area of the left rostral middle frontal gyrus. No significant effects of enduring pain or ELA-by-enduring pain interaction were observed across grey matter, white matter, or functional connectivity measures. Conclusions: ELA was associated with alterations in fronto-striatal regions in late childhood, but these changes were not linked to enduring pain in early adolescence. These findings suggest that ELA-related neurobiological alterations may represent early markers of vulnerability rather than concurrent correlates of enduring pain. Longitudinal follow-up is needed to determine whether these alterations contribute to later chronic pain risk.
Spann, D. J.; Hall, L. M.; Moussa-Tooks, A.; Sheffield, J. M.
Background: Negative symptoms are core features of schizophrenia that relate strongly to functional impairment, yet interventions targeting these symptoms remain largely ineffective. Emerging theoretical work highlights how environmental factors may shape and maintain negative symptoms. Although racial disparities in schizophrenia diagnosis among Black Americans are well documented and linked to racial stress and psychosis, the impact of racial stress on negative symptoms has not been examined. This study provides an initial test of a novel theory proposing that racial stress - here measured by racial discrimination - influences negative symptom severity through exacerbation of negative cognitions about the self, particularly defeatist performance beliefs (DPB). Study Design: Participants diagnosed with schizophrenia-spectrum disorder (SSD) (N = 208; 80 Black, 128 White) completed the Positive and Negative Syndrome Scale (PANSS), the Defeatist Beliefs Scale, and self-report measures of subjective racial and ethnic discrimination (Racial and Ethnic Minority Scale and General Ethnic Discrimination Scale). Relationships among variables were tested using linear regression and mediation analysis. Study Results: Black participants exhibited significantly greater total and experiential negative symptoms than White participants with no group difference in DPB. Racial discrimination explained 46% of the relationship between race and negative symptoms. Among Black participants, higher DPB were associated with greater negative symptom severity. Discrimination was positively related to both DPB and negative symptoms. DPB partially mediated the relationship between discrimination and negative symptoms. Conclusions: Findings suggest that racial stress contributes to negative symptom severity via defeatist beliefs among Black individuals, highlighting potential targets for culturally informed interventions.
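The mediation analysis this abstract describes can be sketched with the classic product-of-coefficients approach; this is a generic illustration with invented data, not the study's model (which may additionally adjust for covariates):

```python
from statistics import mean

def css(u, v):
    """Centered sum of cross-products."""
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v))

def mediation(x, m, y):
    """Product-of-coefficients mediation with OLS paths:
    a: x -> m;  b: m -> y adjusting for x;  c: total effect of x on y.
    Returns the indirect effect a*b and the proportion mediated a*b/c."""
    sxx, smm, sxm = css(x, x), css(m, m), css(x, m)
    sxy, smy = css(x, y), css(m, y)
    a = sxm / sxx                                          # x -> m path
    c = sxy / sxx                                          # total effect
    b = (smy * sxx - sxy * sxm) / (sxx * smm - sxm ** 2)   # m -> y given x
    indirect = a * b
    return indirect, indirect / c

# Invented data: x = discrimination, m = defeatist beliefs, y = symptoms
ind, prop = mediation([0, 1, 2, 3], [1, 1, 3, 3], [2, 3, 7, 8])
```

In OLS the identity c = c' + a*b holds exactly, so the "proportion mediated" is simply the indirect effect divided by the total effect; published analyses typically add bootstrap confidence intervals around it.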
Xu, J.; Parker, R. M. A.; Bowman, K.; Clayton, G. L.; Lawlor, D. A.
Background: Higher levels of sedentary behaviour, such as leisure screen time (LST), and lower levels of physical activity are associated with diseases across multiple body systems, which contribute to a large global health burden. Whether these associations are causal is unclear. The primary aim of this study is to investigate the causal effects of higher LST (given greater power) and, secondarily, lower moderate-to-vigorous intensity physical activity (MVPA), on a wide range of diseases in a hypothesis-free approach. Methods: A two-sample Mendelian randomisation phenome-wide association study was conducted for the main analyses. Single nucleotide polymorphisms (SNPs) were first selected as genetic instruments for the exposures LST (hours of television watched per day; 117 SNPs) and MVPA (higher vs. lower; 18 SNPs) based on the genome-wide significance threshold (p < 5×10⁻⁸) from the largest relevant genome-wide association study (GWAS). For disease outcomes, we used summary results from FinnGen GWAS, including 1,719 diseases defined by hospital discharge International Classification of Diseases (ICD) codes in 453,733 European participants. For the main analyses, we used the inverse-variance weighting method with a Bonferroni-corrected p-value threshold of p ≤ 3.47×10⁻⁴. Sensitivity analyses included Steiger filtering, MR-Egger and weighted median analyses, and data from UK Biobank were used to explore replication. Findings: Genetically predicted higher LST was associated with increased risk of 87 (5.1% of the 1,719) diseases. Most of these diseases were in the musculoskeletal and connective tissue (n=37), genitourinary (n=12) and respiratory (n=8) systems. Genetic liability to lower MVPA was associated with six diseases: three in the musculoskeletal and connective tissue and genitourinary systems (with greater risk of these diseases also identified for higher LST), and three in the respiratory and genitourinary systems. Sensitivity analyses largely supported the main analyses. Results replicated in UK Biobank where data were available. Conclusions: Higher levels of sedentary behaviour, and lower levels of physical activity, causally increase the risk of diseases across multiple body systems, making them promising targets for reducing multimorbidity.
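A minimal sketch of the inverse-variance weighting step used in the main analyses, assuming two-sample MR with SNP-level summary statistics (per-SNP ratio estimates weighted by their approximate precision); the betas below are invented:

```python
def mr_ivw(beta_exp, beta_out, se_out):
    """Inverse-variance weighted MR estimate from SNP summary statistics:
    per-SNP ratio estimates beta_out/beta_exp are pooled with weights
    beta_exp**2 / se_out**2 (the usual first-order approximation)."""
    ratios = [bo / be for be, bo in zip(beta_exp, beta_out)]
    weights = [(be / se) ** 2 for be, se in zip(beta_exp, se_out)]
    est = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = (1 / sum(weights)) ** 0.5
    return est, se

# Invented summary statistics for two instruments
est, se = mr_ivw([0.1, 0.2], [0.05, 0.10], [0.01, 0.02])
```

The sensitivity methods named above (MR-Egger, weighted median) reweight or relax this same set of ratio estimates to probe pleiotropy, which is why they are reported alongside the IVW result rather than instead of it.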
Pietilainen, O.; Salonsalmi, A.; Rahkonen, O.; Lahelma, E.; Lallukka, T.
Objectives: Longer lifespans mean more time spent in retirement, despite efforts to raise the retirement age. It is therefore important to study how the retirement years can be spent free of disease. This study examined socioeconomic and sociodemographic differences in healthy years spent in retirement. Methods: We followed a cohort of retired Finnish municipal employees (N=4231, average follow-up 15.4 years) through national administrative registers for major chronic diseases: cancer, coronary heart disease, cerebrovascular disease, diabetes, asthma or chronic obstructive pulmonary disease, dementia, mental disorders, and alcohol-related disorders. Median healthy years in retirement and age at first occurrence of illness (ICD-10 and ATC-based) in each combination of sex, occupational class, and age of retirement were predicted using Royston-Parmar models. Prevalence rates for each diagnostic group were calculated. Results: The most healthy years in retirement were enjoyed by women who had worked in semi-professional jobs and retired at age 60-62 (median predicted healthy years 11.6, 95% CI 10.4-12.7). The fewest healthy years in retirement were seen among men who had worked in routine non-manual jobs and retired after age 62 (median predicted healthy years 6.5, 95% CI 4.4-9.5). Diabetes was slightly more common among lower-occupational-class women, and dementia among manual-class women who had retired at age 60-62. Discussion: Healthy years in retirement are not shared equally between women and men, or between those who retire earlier and later. Policies aiming to increase the retirement age should consider the effects of these gaps on retirees and the equitability of those effects.
Hung, J.; Smith, A.
The global ambition to end the human immunodeficiency virus (HIV) epidemic requires understanding which system-level policy levers, enacted under the framework of Universal Health Coverage (UHC), are most effective in achieving both transmission reduction and diagnostic coverage. This study addresses an important evidence gap by quantifying the within-country association between measurable UHC policy indicators and the estimated rate of new HIV infections across nine Southeast Asian countries between 2013 and 2022. Employing a fixed-effects panel-data methodology, the analysis controls for time-invariant national heterogeneity, ensuring reliable estimates of policy impact. We found that marginal changes in total current health expenditure (CHE) as a percentage of gross domestic product (GDP) were not statistically significantly associated with changes in HIV incidence. However, increases in the UHC Infectious Disease Service Coverage Index were statistically significantly associated with concurrent reductions in HIV incidence (p < 0.001), suggesting the efficacy of targeted service implementation as the principal driver of curbing new HIV infections. In addition, the UHC Reproductive, Maternal, Newborn, and Child Health Service Coverage Index exhibited a statistically significant positive association with changes in HIV incidence (p < 0.01), which is interpreted as a vital surveillance artefact resulting from expanded detection and reporting of previously undiagnosed HIV cases. Furthermore, out-of-pocket (OOP) health expenditure as a percentage of CHE showed a counter-intuitive negative association with changes in HIV incidence (p < 0.01), suggesting this metric primarily reflects ongoing indirect cost burdens on the established patient cohort, or, alternatively, presents a diagnostic access barrier that results in lower case finding.
These findings suggest that policymakers should prioritise investment in targeted infectious disease service efficacy over aggregate fiscal commitment and utilise integrated sexual health platforms for strengthened HIV surveillance and case identification.
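The fixed-effects panel estimator underlying this analysis (demeaning within each country to remove time-invariant national heterogeneity) can be sketched for a single regressor; the panel below is invented:

```python
from statistics import mean

def within_estimator(panel):
    """One-regressor fixed-effects (within) estimator.
    panel maps each unit (e.g. country) to its (x, y) observations over
    time. Demeaning x and y within each unit removes time-invariant
    unit heterogeneity; pooled OLS on the demeaned data gives the slope."""
    xd, yd = [], []
    for obs in panel.values():
        mx = mean(x for x, _ in obs)
        my = mean(y for _, y in obs)
        xd += [x - mx for x, _ in obs]
        yd += [y - my for _, y in obs]
    return sum(a * b for a, b in zip(xd, yd)) / sum(a * a for a in xd)

# Invented panel: two countries with different intercepts, common slope 2
panel = {
    "A": [(1, 12), (2, 14), (3, 16)],
    "B": [(1, 2), (2, 4), (3, 6)],
}
```

The within transformation absorbs each country's intercept, so only variation over time within a country identifies the slope; anything constant per country (geography, baseline health system) cannot confound it.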
McKeown, D. J.; Cruzado, O. S.; Colombo, G.; Angus, D. J.; Schinazi, V. R.
Purpose: Navigational ability develops throughout childhood alongside the maturation of brain regions supporting egocentric and allocentric processing. In Autism Spectrum Disorder (ASD), atypical hippocampal development may impact flexible spatial memory; however, findings on navigational ability in autistic children remain inconsistent. This study aimed to compare both objective and perceived navigation ability in children with ASD and typically developing (TD) peers. Method: Twenty-six children with high-functioning ASD and twenty-five age- and gender-matched TD children (M_age = 12.04 years, SD = 1.64) completed a battery of navigational tasks from the Spatial Performance Assessment for Cognitive Evaluation (SPACE), including Path Integration, Egocentric Pointing, Mapping, Associative Memory, and Perspective Taking. Perceived navigation ability was assessed using the Santa Barbara Sense of Direction (SBSOD) scale. Results: No significant group differences were observed across any objective navigation tasks. However, children with ASD reported significantly lower perceived navigation ability compared to TD peers. Conclusion: These findings suggest a dissociation between perceived and actual navigational ability in ASD. By early adolescence, objective navigation performance appears intact, potentially reflecting sufficient maturation of underlying neural systems or the presence of compensatory mechanisms. The results underscore the importance of incorporating objective, task-based measures when assessing cognitive abilities in autistic populations.
Xiao, M.; Girard, Q.; Pender, M.; Rabezara, J. Y.; Rahary, P.; Randrianarisoa, S.; Rasambainarivo, F.; Rasolofoniaina, O.; Soarimalala, V.; Janko, M. M.; Nunn, C. L.
Purpose: Antibiotic use (ABU) is a major driver of antimicrobial resistance (AMR), but ABU patterns are poorly understood in low-income countries where the burden of AMR is great and ABU is insufficiently regulated. Here, we report ABU from ten sites ranging from rural villages to small cities in Madagascar, a country with high AMR levels, and present results from modeling to identify factors that may be associated with ABU in this setting. Methods: We conducted surveys of 290 individuals from ten sites in the SAVA Region of northeast Madagascar to gather data on sociodemographic characteristics, agricultural and animal husbandry practices, recent antibiotic use, the antibiotics that participants recalled using in their lifetimes, and the sources of their antibiotics. Using these data, we conducted statistical analyses with a mixed-effects logistic model to determine which characteristics were associated with recent antibiotic use. Results: Nearly all respondents (N=283, 97.6%) reported ABU in their lifetimes, with amoxicillin being the most widely reported antibiotic (N=255, 90.1% of those reporting ABU). All recalled antibiotics were classified as frontline drugs except for ciprofloxacin. Most respondents who reported antibiotic use also reported obtaining antibiotics without prescriptions from local stores (N=273, 96.5%), while only 52.3% (N=148) reported obtaining antibiotics through a prescriptive route, such as from a health clinic or private doctor. Of the 127 individuals (44.9%) who reported recent ABU, men were found to be significantly less likely to have recently taken antibiotics than women. Conclusions: Our findings provide new insights into ABU in agricultural settings in low-income countries, which have historically been understudied in AMR and pharmacoepidemiologic research. Knowledge of ABU patterns supports understanding of AMR dynamics and AMR control efforts in these contexts, such as interventions on inappropriate antibiotic dispensing.
Key points:
- Antibiotic use (ABU) in Madagascar is largely unstudied despite its role in antimicrobial resistance (AMR), of which Madagascar bears a high burden.
- ABU was widespread among livestock owners in northeast Madagascar, with the majority of study participants reporting ABU in their lifetimes and most people reporting ABU also having taken antibiotics in the previous three months.
- Most respondents reported obtaining their antibiotics from non-pharmaceutical stores, indicating high levels of unregulated ABU, though more than half also reported sourcing their antibiotics through prescriptive means (such as doctors and health clinics).
- Men were less likely than women to have taken antibiotics in the previous three months.
- These findings support the development of interventions to mitigate the burden of AMR in Madagascar and similar contexts while underscoring the need for more comprehensive research on the drivers and patterns of ABU.

Plain language summary: In this study, we provide basic information on antibiotic use (ABU) patterns in Madagascar, a country that experiences high levels of resistance but has been particularly understudied in AMR and pharmacological research. We surveyed 290 farmers with livestock from ten sites across northeast Madagascar about their ABU and found that nearly all study participants (N=283, 97.6%) have used antibiotics in their lifetimes, while a little under half of those who reported ABU also reported using antibiotics in the previous three months (N=127, 44.9%). The most used antibiotic was amoxicillin (N=255, 90.1%). Most people obtained their antibiotics from sources that do not require prescriptions, like general stores, indicating that most ABU is unregulated. Through modeling, we also found that men were less likely than women to have taken antibiotics in the previous three months (OR=0.50, CI 0.30-0.82).
These findings help us better understand the dynamics of ABU in low-income countries, which have historically been understudied in AMR and pharmacological research. They also support efforts to mitigate the burden of AMR by revealing ABU dynamics that may contribute to the emergence and spread of AMR, as well as identifying targets for intervention to curb inappropriate ABU.
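The reported sex effect (OR=0.50, CI 0.30-0.82) comes from a mixed-effects logistic model; a simpler unadjusted odds ratio with a Wald interval from a 2×2 table can be sketched as follows, with invented counts chosen only to mimic an OR of 0.5:

```python
from math import exp, log, sqrt

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio with a Wald confidence interval from a 2x2 table:
    a, b = exposed with / without the outcome;
    c, d = unexposed with / without the outcome."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE on the log-OR scale
    lo = exp(log(or_) - z * se_log)
    hi = exp(log(or_) + z * se_log)
    return or_, lo, hi

# Invented counts: 50/150 men vs. 80/160 women with recent ABU
or_, lo, hi = odds_ratio(50, 100, 80, 80)
```

The study's mixed-effects model additionally adjusts for covariates and a random effect for site, so its interval will generally differ from this crude table-based one.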
Shaetonhodi, N. G.; De Vos, L.; Babalola, C.; de Voux, A.; Joseph Davey, D.; Mdingi, M.; Peters, R. P. H.; Klausner, J. D.; Medina-Marino, A.
Background: Curable sexually transmitted infections (STIs), including Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis, remain highly prevalent among pregnant women in South Africa. Despite poor diagnostic performance in pregnancy, syndromic management remains standard care. Point-of-care (POC) screening enables aetiological diagnosis and same-visit treatment but is not yet included in national guidelines. We conducted a mixed-methods process evaluation to examine determinants of antenatal POC STI screening implementation in public facilities. Methods: This evaluation was embedded within the three-arm Philani Ndiphile randomized trial (March 2021-February 2025) across four public clinics in the Eastern Cape. Screening used a near-POC, electricity-dependent nucleic acid amplification test with a 90-minute turnaround time. Reach, Adoption, Implementation, and Maintenance were assessed using the RE-AIM framework. Quantitative indicators included uptake of screening, treatment, and follow-up attendance. Qualitative data included in-depth interviews with 20 pregnant women and five focus group discussions with 21 research staff and government healthcare workers. The Consolidated Framework for Implementation Research guided qualitative analysis. Findings were integrated using narrative weaving. Results: Screening uptake was high (99.0%), with treatment coverage of 95.2% at baseline and 93.5% at repeat screening. Same-day treatment was lower (50.7% and 69.8%) and varied substantially by facility, reflecting operational constraints including turnaround time, patient volume, infrastructure, and electricity. Attendance was higher when screening was integrated into routine ANC. Women valued screening for infant health, while providers recognised advantages over syndromic management but highlighted workforce, resource, and maintenance constraints. Socioeconomic factors, including transport costs, hunger, and work commitments, influenced retention and waiting.
Conclusions: Antenatal POC STI screening was acceptable and achieved high treatment coverage in a research setting. However, same-day treatment was constrained by the operational requirements of the testing platform. Scale-up will require workflow integration, strengthened health system capacity, and faster diagnostics suited to routine antenatal care.
Key Messages
What is already known on this topic: Syndromic management remains standard antenatal care in many low-resource settings despite failing to detect up to 89% of infections, which remain asymptomatic. Point-of-care aetiological screening has demonstrated feasibility, acceptability, and potential clinical benefit in research settings, yet has not been widely adopted into national policy. Limited evidence exists on the health system requirements and contextual determinants influencing scale-up within routine public facilities.
What this study adds: This mixed-methods process evaluation demonstrates high uptake and treatment coverage of antenatal POC STI screening in a trial setting, while identifying facility-level, structural, and socioeconomic factors shaping same-day treatment and retention. We show that implementation success varies substantially across clinics and depends on assay characteristics, workflow integration, human resources, infrastructure reliability, and follow-up capacity.
How this study might affect research, practice or policy: These findings provide implementation-relevant evidence to inform national policy deliberations on integrating POC STI screening into antenatal care. Sustainable scale-up will require context-adapted delivery models, strengthened workforce and supply systems, faster diagnostics, and alignment with existing ANC workflows to ensure equitable and durable impact.
Areb, M.; Huybregts, L.; Tamiru, D.; Toure, M.; Biru, B.; Fall, T.; Haddis, A.; Belachew, T.
Background: This study aimed to assess caregiver knowledge of Infant and Young Child Feeding (IYCF), child health, severe acute malnutrition (SAM) screening, and Community-Based Management of Acute Malnutrition (CMAM), its determinants, and its associations with IYCF and WaSH (water, sanitation, and hygiene) practices among caregivers of children aged 6-59 months with SAM in Ethiopian agrarian and pastoralist settings.
Methods: Data were drawn from the baseline survey of the R-SWITCH Ethiopia cluster-randomized controlled trial (cRCT), which screened ~28,000 children aged 6-59 months and identified 686 SAM cases. Caregiver knowledge was evaluated using a validated 32-item questionnaire (Cronbach's α for internal reliability) and analyzed via linear mixed-effects and Poisson regression models in Stata 17.
Results: Caregiver knowledge was positively associated with improved IYCF/WaSH practices among children aged 6-23 months with SAM, including higher minimum dietary diversity (MDD: IRR = 1.50), minimum acceptable diet (MAD: IRR = 1.63), and reduced zero vegetable/fruit intake (IRR = 0.77), as well as with MDD in children aged 24-59 months, improved water access (IRR = 1.19), water treatment (IRR = 2.02), and handwashing stations (IRR = 1.41). Literacy (β = 4.1; 95% CI: 1.5-6.6; p = 0.016), pregnancy (β = 4.4; 95% CI: 0.9-7.8; p = 0.018), having the child weighed at a health post or health center (β = 8.9; 95% CI: 3.5-14.2; p ≤ 0.001), and higher household wealth index (β = 11.8; 95% CI: 3.6-20.1; p = 0.005) were associated with higher knowledge, while possible depression (β = -0.3; 95% CI: -0.5 to 0.0; p = 0.015) was associated with lower knowledge.
Conclusion: Caregiver knowledge predicts better IYCF/WaSH practices among children aged 6-59 months with SAM. Literacy, pregnancy, having the child weighed at a health post or health center, and greater household wealth were associated with higher caregiver knowledge, whereas possible depression was associated with lower knowledge.
Integrating context-specific caregiver education and mental health support into CMAM, growth monitoring and promotion (GMP), and primary care services could enhance feeding and WaSH practices in Ethiopia.
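The IRRs reported above are exponentiated Poisson-regression coefficients: IRR = exp(β), so a one-unit increase in the predictor multiplies the expected rate by the IRR. A minimal sketch of that interpretation (the intercept is a hypothetical illustration; only the MDD IRR of 1.50 is taken from the results, and this is not the authors' Stata code):

```python
import math

def irr(beta):
    """Incidence rate ratio implied by a Poisson-regression coefficient."""
    return math.exp(beta)

# A log-linear Poisson model: log(rate) = b0 + b1 * knowledge_score.
# b0 is hypothetical; b1 is back-solved from the reported MDD IRR of 1.50.
b0, b1 = -1.2, math.log(1.50)

def expected_rate(knowledge_score):
    return math.exp(b0 + b1 * knowledge_score)

# One extra unit of caregiver knowledge multiplies the expected rate by the IRR
ratio = expected_rate(3) / expected_rate(2)
print(round(ratio, 2))  # 1.5
```

Because the model is log-linear, this multiplicative effect is the same at every level of the predictor.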
Heffernan, P. M.; van den Berg, H.; Yadav, R. S.; Murdock, C. C.; Rohr, J. R.
Background: Insecticides remain the cornerstone of mosquito vector control for malaria, dengue, and other mosquito-borne diseases, yet global patterns of deployment and their socioeconomic and environmental drivers are poorly characterized. Understanding where and why insecticides are used is essential for better targeting control efforts and ensuring they are effective, equitable, and efficient.
Methods: We analyzed annual country-level insecticide-use data from 122 countries (1990-2019), reported as standard spray coverage for insecticide-treated nets (ITNs), residual spraying (RS), spatial spraying (SS), and larviciding (LA). Generalized linear mixed models and hurdle models quantified associations between deployment and disease incidence, human development index (HDI), human population density, temperature, and precipitation. Models were evaluated using repeated cross-validation and applied to generate downscaled predictions of insecticide use at subnational administrative region level 2 (ADM2) globally.
Findings: Insecticide deployment increased with malaria and dengue incidence, but this response was substantially stronger in higher-HDI countries, indicating that deployment depends on socioeconomic capacity as well as disease burden, with weaker scaling in lower-resource settings. Intervention types exhibited distinct patterns: ITN use tracked malaria burden, whereas infrastructure-intensive approaches (e.g., RS and SS) were concentrated in higher-HDI settings and increased with Aedes-borne disease incidence. Downscaled ADM2-level maps uncovered substantial within-country heterogeneity that is obscured at the national scale, highlighting regions where predicted deployment remains low relative to disease risk across sub-Saharan Africa, South Asia, and parts of Latin America.
Interpretation: Global insecticide deployment reflects not only epidemiological need but also economic and logistical capacity, creating mismatches between risk and control.
High-resolution mapping can support more equitable allocation of interventions, guide insecticide resistance stewardship, and improve strategic planning as climate and urbanization reshape mosquito-borne disease risk.
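The hurdle models mentioned above handle the many zero-deployment observations by pairing a binary "any use" component with a separate intensity component for positive values. A minimal sketch of that two-part structure (the function name and every coefficient are hypothetical illustrations, not the fitted model):

```python
import math

def hurdle_expected_coverage(hdi, b_zero=(-2.0, 4.0), b_pos=(0.5, 1.5)):
    """Expected coverage under a two-part hurdle specification:
    a logistic part for whether any deployment occurs, multiplied by
    a log-linear part for coverage given that deployment occurs.
    Coefficients are hypothetical and keyed to an HDI in [0, 1]."""
    a0, a1 = b_zero
    c0, c1 = b_pos
    p_any = 1.0 / (1.0 + math.exp(-(a0 + a1 * hdi)))  # Pr(deployment > 0)
    mu_pos = math.exp(c0 + c1 * hdi)                  # E[coverage | deployment > 0]
    return p_any * mu_pos

# With positive coefficients on HDI in both parts, higher HDI raises both
# the odds of any deployment and its expected intensity.
low, high = hurdle_expected_coverage(0.4), hurdle_expected_coverage(0.8)
print(low < high)  # True
```

Separating the hurdle (any use) from the intensity (how much) lets the two processes respond to different drivers, which is what allows the analysis to distinguish capacity constraints from epidemiological need.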
Maneraguha, F. K.; Cote, J.; Bourbonnais, A.; Arbour, C.; Chagnon, M.; Hatem, M.
Background: Comprehensive sexuality education (CSE) is essential to the health and well-being of young people. In the Democratic Republic of Congo (DRC), where more than 65% of the population is under the age of 25, access to interpersonal CSE remains limited owing to sociocultural and structural barriers, exposing young people to persistent socio-sanitary vulnerabilities. In this context, mobile health apps (MHAs) are a promising solution, supported by the growing use of smartphones among young Congolese. However, this group's intention to use MHAs for CSE has received little research attention to date.
Objective: The aim of this study was to identify predictors of the intention to use MHAs among young Congolese, based on the extended Unified Theory of Acceptance and Use of Technology (UTAUT2).
Methods: A predictive correlational study was conducted in eight public secondary schools in Bukavu (DRC) with a stratified random sample of 859 students. Predictors of intention to use--performance expectancy (PE), effort expectancy (EE), social influence (SI), facilitating conditions (FC), and perceived risk (PR)--and moderators--age, gender, and past MHA experience--were measured from data collected through a self-administered UTAUT questionnaire. Descriptive and multivariate analyses were run in SPSS version 28.
Results: Mean age of participants was 16.3 years (SD = 1.5). Boys made up 55.1% of the sample. Overall, 51.0% of participants owned a smartphone, of whom 62.3% reported easy access to mobile data and 16.2% were already using MHAs to learn about sexual health. Intention to use MHAs was positively influenced by PE (β = 0.523, p < 0.001), EE (β = 0.115, p < 0.001), and SI (β = 0.113, p < 0.001). FC (p = 0.260) and PR (p = 0.631), however, had no significant influence. Age moderated all of the relationships tested (F(1, 849-854) = 9.97-20.82; p ≤ 0.002), with more marked effects among younger participants aged 14-15 years.
The final model explained 44% of the variance, indicating good predictive power.
Conclusion: Intention to use digital CSE was explained primarily by PE, EE, and SI and was moderated by age. To strengthen this intention, stakeholders will need to promote e-interventions that are relevant, easy to use, socially valued, and tailored to young people's needs and to the local context.
Malingumu, E. E.; Badaga, I.; Kisendi, D. D.; Pierre Kabore, R. W.; Yeremon, O. G.; Mohamed, M. A.; He, Q.
This study evaluates the feasibility of implementing artificial intelligence (AI)-driven disease surveillance systems at Julius Nyerere International Airport (JNIA) in Tanzania, a key hub for regional and international travel. Through a mixed-methods approach combining qualitative interviews and quantitative surveys, the research assesses the infrastructure, human resource capacity, and regulatory frameworks necessary for AI integration. Findings indicate that while Port Health Officers are strongly optimistic about AI's potential to enhance disease detection, the airport faces significant barriers, including outdated infrastructure, insufficient technical resources, and a lack of trained personnel. Ethical and privacy concerns, particularly surrounding data security, also emerged as key challenges, compounded by limited public awareness and the socio-cultural acceptability of AI systems. Furthermore, the study identifies gaps in national policies and inter-agency coordination that hinder the effective implementation of AI technologies. The research concludes that while current conditions render AI adoption infeasible, strategic investments in infrastructure, workforce training, and policy development could pave the way for future integration, enhancing public health surveillance at JNIA and potentially other airports in low- and middle-income countries. This study contributes critical insights into the barriers and opportunities for AI-driven disease surveillance in low-resource settings, focusing specifically on a high-priority class of transit points: international airports. It emphasizes the importance of region-specific solutions to enhance health security in East Africa and supports the broader global health agenda by advocating for international collaboration and the development of scalable disease surveillance systems.
Future research should explore pilot AI implementations at other airports to evaluate real-world challenges and refine AI systems for broader applicability, including cost-effectiveness analyses and integration of public perspectives on AI.
Nguyen, D.; O'Neill, C.; Akaraci, S.; Tate, C.; Wang, R.; Garcia, L.; Kee, F.; Hunter, R. F.
Highlights:
- Health inequalities have widened over 15 years, favouring high-income groups
- Inequality in physical activity and mental health widened the most pre-intervention
- Post-intervention, inequalities persisted but stayed relatively unchanged
- Long-term illness and unemployment were key drivers of inequality
- The greenway may have slowed the widening of inequality, but its impact is limited
Background: Evidence concerning health inequalities following urban green and blue space (UGBS) interventions is limited. This study examined changes in health inequalities after a major urban regeneration project, the Connswater Community Greenway (CCG), in Belfast, UK.
Methods: Cross-sectional household surveys were conducted in 2010/11 (baseline), 2017/18 (immediately after completion), and 2023/24 (long-term follow-up) with a sample of approximately 1,000 adults in each wave. Income-related health inequalities for three outcomes (physical activity, mental wellbeing, and quality of life) were measured using concentration indices (CI). A regression-based decomposition of the concentration index examined the contribution of sociodemographic factors to the observed inequalities underpinning each outcome over time.
Results: Across the three waves, inequalities widened over the 15-year period for all three health outcomes, with those from high-income groups reporting higher levels of physical activity (CI = 0.33, SE = 0.026), better mental wellbeing (CI = 0.03, SE = 0.003), and better quality of life (CI = 0.09, SE = 0.008). The widening mainly occurred during the construction phase of the CCG (2010-2017) and remained stable post-intervention (2017-2023). Decomposition analysis revealed that the pro-poor concentration of long-term illness and unemployment was the key driver, together explaining approximately 51%-76% of the inequalities.
Conclusion: The CCG had limited success in reducing health inequalities, which were mainly driven by long-term illness and unemployment - factors beyond the direct scope of the UGBS intervention - leaving low-income groups likely to fall further behind wealthier groups. The widening of inequality is consistent with findings from other public interventions that did not have a primary equity focus.
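The concentration index used above is conventionally computed as twice the covariance between the health outcome and the fractional income rank, divided by the mean outcome, so that positive values indicate a pro-rich distribution. A minimal sketch with toy data (the helper and its inputs are illustrative, not the study's code):

```python
import numpy as np

def concentration_index(health, income):
    """CI = 2 * cov(health, fractional income rank) / mean(health).
    Positive values: outcome concentrated among higher-income (pro-rich)
    groups; negative values: pro-poor concentration."""
    h = np.asarray(health, dtype=float)
    order = np.argsort(income)               # sort by income, poorest first
    h = h[order]
    n = h.size
    rank = (np.arange(1, n + 1) - 0.5) / n   # fractional income rank
    return 2.0 * np.cov(h, rank, bias=True)[0, 1] / h.mean()

# An outcome rising strictly with income yields a pro-rich (positive) CI
print(concentration_index([1, 2, 3, 4], [10, 20, 30, 40]))  # 0.25
# A perfectly equal outcome yields CI = 0
print(concentration_index([3, 3, 3, 3], [10, 20, 30, 40]))  # 0.0
```

Because the index depends only on the income ordering, not income magnitudes, it isolates income-related inequality in the outcome, which is what makes the regression-based decomposition into sociodemographic contributions possible.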